Markov models are widely used to describe stochastic dynamics. Here, we show that Markov models are a natural consequence of the dynamical principle of Maximum Caliber. First, we show that when there are different possible dynamical trajectories in a time-homogeneous process, the only type of process that maximizes the path entropy, for any given singlet statistics, is a sequence of independent, identically distributed (i.i.d.) random variables, which is the simplest Markov process. If the data are in the form of sequential pairwise statistics, then maximizing the caliber dictates that the process is Markovian with a uniform initial distribution. Furthermore, if a non-uniform initial distribution is known, or if multiple trajectories are conditioned on an initial state, then the Markov process is still the only one that maximizes the caliber. Second, given a model, MaxCal can be used to compute the parameters of that model. We show that this procedure is equivalent to the maximum-likelihood method of inference in statistics.
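The stated equivalence between MaxCal parameter inference and maximum likelihood can be illustrated concretely for a finite-state Markov chain: the maximum-likelihood transition matrix is simply the row-normalized matrix of observed pairwise transition counts. The sketch below uses a small, hypothetical three-state trajectory (the data are illustrative, not from the paper) and assumes every state is visited at least once so each row normalizes.

```python
import numpy as np

# Hypothetical observed trajectory over 3 states (illustration only).
traj = [0, 1, 2, 1, 0, 1, 1, 2, 0, 1, 2, 2, 1, 0, 0, 1]

n = 3
counts = np.zeros((n, n))
for a, b in zip(traj[:-1], traj[1:]):
    counts[a, b] += 1  # tally observed pairwise (sequential) statistics

# Maximum-likelihood estimate (equivalently, the MaxCal solution under
# pairwise constraints): row-normalize the transition counts.
P_mle = counts / counts.sum(axis=1, keepdims=True)

print(P_mle)
```

Each row of `P_mle` is a valid conditional distribution (it sums to one), and any perturbation away from these row-normalized frequencies strictly lowers the log-likelihood of the observed trajectory.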